{ "cells": [ { "cell_type": "markdown", "id": "745f3054-38dc-4fec-9dec-893dbf98a4c8", "metadata": {}, "source": [ "# Tutorial: Event-based logging output\n", "In this notebook we show how to enable and use event-based logging, a set of standardised outputs from `binary_c` for events like supernovae and Roche-lobe overflow episodes.\n", "\n", "The events that are available in this version of `binary_c-python` are listed on [the event-based logging page in the documentation](https://binary_c.gitlab.io/binary_c-python/event_based_logging_descriptions.html).\n", "\n", "The relevant options for this functionality are prefixed with `event_based_logging_` (see the [population options documentation](https://binary_c.gitlab.io/binary_c-python/population_options_descriptions.html) and the [binary_c options documentation](https://binary_c.gitlab.io/binary_c-python/binary_c_parameters.html)).\n", "\n", "Each event is tagged with a UUID that allows matching events that come from the same system. This is useful, for example, to track the events that precede the formation of a double compact object, so we can study its evolutionary path."
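, "\n", "Each event line carries its system's UUID as the first column, so collecting all events belonging to one system is a simple group-by. A minimal sketch (the helper name `group_events_by_uuid` is made up for illustration and is not part of `binary_c-python`):\n", "\n", "```python\n", "from collections import defaultdict\n", "\n", "def group_events_by_uuid(event_lines, uuid_index=0):\n", "    # Group whitespace-separated event-log lines by their UUID column\n", "    grouped = defaultdict(list)\n", "    for line in event_lines:\n", "        grouped[line.split()[uuid_index]].append(line)\n", "    return grouped\n", "```\n"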
] }, { "cell_type": "markdown", "id": "ea335f84-992b-492c-a354-10ba78e0f200", "metadata": { "tags": [] }, "source": [ "## Enabling event-based logging in `binary_c`\n", "To enable the output of certain events, set one or more of the following options:\n", "\n", "- `event_based_logging_SN=1`: Enables supernova event logging\n", "- `event_based_logging_RLOF=1`: Enables RLOF event logging\n", "- `event_based_logging_DCO=1`: Enables double compact-object event logging\n" ] }, { "cell_type": "markdown", "id": "6f3e8475-5381-4c79-944d-c6702cff0d4f", "metadata": { "tags": [] }, "source": [ "## Enabling event-based log handling in `binary_c-python`\n", "To enable the automatic processing of the event logs with `binary_c-python`, we need to set `event_based_logging_handle_output=1`.\n", "\n", "This allows `binary_c-python` to automatically process the output of `binary_c` and create output files that contain the event logs.\n", "\n", "There are several options related to the processing of these output files:\n", "- `event_based_logging_output_directory`: directory that the event-based logs are written to.\n", "- `event_based_logging_combine_individual_event_files`: whether to automatically combine all the process-specific event log files into one combined file.\n", "- `event_based_logging_combined_events_filename`: filename of the combined event file.\n", "- `event_based_logging_remove_individual_event_files_after_combining`: whether to remove the process-specific event log files after combining.\n", "- `event_based_logging_split_events_file_to_each_type`: whether to split the combined event file into event-specific files.\n", "- `event_based_logging_remove_original_combined_events_file_after_splitting`: whether to remove the combined event file after splitting.\n", "- `event_based_logging_output_separator`: separator used for writing the events.\n", "- `event_based_logging_output_parser`: parsing function to handle the output of `binary_c`. 
A default function is already provided, so there is no need to supply your own unless you have special requirements.\n", "- `event_based_logging_parameter_list_dict`: dictionary that contains the parameter name list for each specific event (see [the event-based logging page in the documentation](https://binary_c.gitlab.io/binary_c-python/event_based_logging_descriptions.html)). The dictionary provided by default handles the events present in this release, but if you add your own events you need to update it or provide a custom one.\n" ] }, { "cell_type": "markdown", "id": "4a0bc9b2-c976-4688-9b2d-9e2d49b17776", "metadata": { "tags": [] }, "source": [ "## Example usage\n", "We now show some example usage of event-based logging. We start, as usual, with some imports and setting up a population object." ] }, { "cell_type": "code", "execution_count": 1, "id": "cda61339-be72-4b61-b307-598b66fdfa47", "metadata": {}, "outputs": [], "source": [ "import os\n", "import json\n", "\n", "from binarycpython.utils.custom_logging_functions import temp_dir\n", "from binarycpython import Population\n", "\n", "TMP_DIR = temp_dir(\"notebooks\", \"notebook_events_based_logging\", clean_path=True)\n", "EVENT_TYPE_INDEX = 3\n", "\n", "data_dir = os.path.join(TMP_DIR, 'data_dir')\n", "os.makedirs(data_dir, exist_ok=True)\n", "\n", "event_based_logging_population = Population(tmp_dir=TMP_DIR, verbosity=2)" ] }, { "cell_type": "markdown", "id": "75cd9bcf-d480-441c-aba1-21ac6c5d697c", "metadata": {}, "source": [ "Let's configure the population object to use event-based logging." ] }, { "cell_type": "code", "execution_count": 2, "id": "97382213-2751-4841-a8cd-f0e9e543af7e", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "adding: num_cores=2 to population_options\n", "adding: event_based_logging_SN=1 to BSE_options by catching the %d\n", "adding: event_based_logging_RLOF=1 to BSE_options by catching the %d\n", "adding: 
event_based_logging_DCO=1 to BSE_options by catching the %d\n", "adding: event_based_logging_handle_output=True to population_options\n", "adding: event_based_logging_output_directory=/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/events to population_options\n", "adding: event_based_logging_combine_individual_event_files=True to population_options\n", "adding: event_based_logging_remove_individual_event_files_after_combining=False to population_options\n", "adding: event_based_logging_split_events_file_to_each_type=False to population_options\n", "adding: event_based_logging_remove_original_combined_events_file_after_splitting=False to population_options\n" ] } ], "source": [ "# Set up the population object\n", "event_based_logging_population.set(\n", "    num_cores=2,\n", "    # data_dir=data_dir,\n", "    # binary_c options related to event-based logging\n", "    event_based_logging_SN=1,\n", "    event_based_logging_RLOF=1,\n", "    event_based_logging_DCO=1,\n", "    # binary_c-python options related to event-based logging\n", "    event_based_logging_handle_output=True,\n", "    event_based_logging_output_directory=os.path.join(TMP_DIR, 'events'),\n", "    event_based_logging_combine_individual_event_files=True,\n", "    event_based_logging_remove_individual_event_files_after_combining=False,\n", "    event_based_logging_split_events_file_to_each_type=False,\n", "    event_based_logging_remove_original_combined_events_file_after_splitting=False,\n", ")" ] }, { "cell_type": "markdown", "id": "25f02c5f-850e-4259-aa13-a1613c323b58", "metadata": {}, "source": [ "Now let's provide some systems that can generate events. We use a fixed list of systems through the source-file sampling functionality, but that is only for this example. For more serious sampling you can use, e.g., 
the grid-based sampling (see the [grid-based sampling notebook](https://binary_c.gitlab.io/binary_c-python/examples/notebook_population.html))." ] }, { "cell_type": "code", "execution_count": 3, "id": "34d49455-9b34-4be0-8d64-c6e70cd3e1d4", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "adding: source_file_sampling_type=command to population_options\n", "adding: source_file_sampling_filename=/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/source_file_sampling_filename.txt to population_options\n", "adding: evolution_type=source_file to population_options\n" ] } ], "source": [ "# Configure the source-file sampling\n", "system_dict_test_list = [\n", "    {\"M_1\": 10},\n", "    {\"M_1\": 10.0, \"M_2\": 0.1, \"orbital_period\": 1000000000},\n", "    {\"M_1\": 1, \"M_2\": 0.5, \"orbital_period\": 100.0},\n", "]\n", "\n", "# Create the file that contains the systems\n", "source_file_sampling_filename = os.path.join(\n", "    TMP_DIR, \"source_file_sampling_filename.txt\"\n", ")\n", "\n", "# Write the source file\n", "with open(source_file_sampling_filename, \"w\") as f:\n", "    # Loop over the system dicts\n", "    for system_dict_test_entry in system_dict_test_list:\n", "        argline = \" \".join(\n", "            [\n", "                \"{} {}\".format(key, val)\n", "                for key, val in system_dict_test_entry.items()\n", "            ]\n", "        )\n", "        f.write(argline + \"\\n\")\n", "\n", "# Update settings\n", "event_based_logging_population.set(\n", "    source_file_sampling_type=\"command\",\n", "    source_file_sampling_filename=source_file_sampling_filename,\n", "    evolution_type=\"source_file\"\n", ")" ] }, { "cell_type": "markdown", "id": "8d3b08d8-6f5e-4482-a494-147cd5db3f93", "metadata": {}, "source": [ "Let's now run these systems and explore the contents of the event files." ] }, { "cell_type": "code", "execution_count": 4, "id": "d0939590-c5b2-4c3e-b246-efee4ea87c16", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Warning: No parse function set. 
Make sure you intended to do this.\n", "setting up the system_queue_filler now\n", "Loading source file from /tmp/binary_c_python-david/notebooks/notebook_events_based_logging/source_file_sampling_filename.txt\n", "Source file loaded\n", "Signalling processes to stop\n", "\n", "****************************************************\n", "* Process 1 finished: *\n", "* generator started at 2023-05-19T11:46:05.171515 *\n", "* generator finished at 2023-05-19T11:46:05.304666 *\n", "* total: 0.13s *\n", "* of which 0.07s with binary_c *\n", "* Ran 1 systems *\n", "* with a total probability of 1 *\n", "* This thread had 0 failing systems *\n", "* with a total failed probability of 0 *\n", "* Skipped a total of 0 zero-probability systems *\n", "* *\n", "****************************************************\n", "\n", "\n", "****************************************************\n", "* Process 0 finished: *\n", "* generator started at 2023-05-19T11:46:05.167359 *\n", "* generator finished at 2023-05-19T11:46:05.363291 *\n", "* total: 0.20s *\n", "* of which 0.14s with binary_c *\n", "* Ran 2 systems *\n", "* with a total probability of 2 *\n", "* This thread had 0 failing systems *\n", "* with a total failed probability of 0 *\n", "* Skipped a total of 0 zero-probability systems *\n", "* *\n", "****************************************************\n", "\n", "\n", "**********************************************************\n", "* Population-4d2bf0d253dc4fce92d16ec4f79f1d58 finished! *\n", "* The total probability is 3. *\n", "* It took a total of 0.59s to run 3 systems on 2 cores *\n", "* = 1.19s of CPU time. 
*\n", "* Maximum memory use 337.516 MB *\n", "**********************************************************\n", "\n", "No failed systems were found in this run.\n" ] }, { "data": { "text/plain": [ "{'population_id': '4d2bf0d253dc4fce92d16ec4f79f1d58',\n", " 'evolution_type': 'source_file',\n", " 'failed_count': 0,\n", " 'failed_prob': 0,\n", " 'failed_systems_error_codes': [],\n", " 'errors_exceeded': False,\n", " 'errors_found': False,\n", " 'total_probability': 3,\n", " 'total_count': 3,\n", " 'start_timestamp': 1684493165.101546,\n", " 'end_timestamp': 1684493165.6958666,\n", " 'time_elapsed': 0.59432053565979,\n", " 'total_mass_run': 21.6,\n", " 'total_probability_weighted_mass_run': 21.6,\n", " 'zero_prob_stars_skipped': 0}" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "event_based_logging_population.evolve()" ] }, { "cell_type": "code", "execution_count": 5, "id": "37a47d9b-3b71-4d65-89f9-f76797558b0f", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/events/all_events.dat\n", "4FACA930-D60C-45B2-876C-8FDC4E8464F5\t1\t0\tSN_BINARY\t10\t0.1\t1e+09\t9.09806e+06\t0\t2.848420259305e+01\t0.02\t59335\t1.33487\t13\t14\t0\t0\t-1\t3.23995e+06\t-1.67576\t9.18507\t5\t727.198\t2.81957\t1.82729\t2.81957\t0.00534839\t0\t3.23834e+06\t9.89406e+06\t0.1\t0.134597\t0\t0\t1\t404.39\t4.31965\t-0.210048\n", "\n", "6CF84E95-DE5F-4884-927D-CD07208DED8F\t1\t0\tSN_SINGLE\t10\t2.848380621878e+01\t0.02\t22227\t1.33469\t13\t14\t0\t0\t9.1865\t5\t724.338\t2.81957\t1.82533\t2.81957\t0.00539487\t0\t1\t438.865\t2.75425\t0.306326\n", "\n", 
"BE487565-5C4B-4DB8-9D67-61E1E2436CA1\t1\t0\tRLOF\t1\t0.5\t100\t103.802\t0\t1.231494440429e+04\t0.02\t21341\t0.970542\t0.50131\t41.8386\t0.468712\t95.7159\t0.244766\t3\t0\t77741.6\t3\t1\t0\t1.231494317672e+10\t0\t0.337181\t0.50131\t0.016932\t0.468712\t2.79234\t0.00161588\t10\t0\t6111.9\t3\t0\t1\t1.231494440429e+10\t0\t0.633361\t0\t0.50131\t0\t0.633361\t202.734\t1\n", "\n" ] } ], "source": [ "combined_events_filename = os.path.join(\n", "    event_based_logging_population.population_options['event_based_logging_output_directory'],\n", "    event_based_logging_population.population_options['event_based_logging_combined_events_filename']\n", ")\n", "print(combined_events_filename)\n", "with open(combined_events_filename, 'r') as f:\n", "    combined_events = f.readlines()\n", "\n", "for event in combined_events:\n", "    print(event)" ] }, { "cell_type": "markdown", "id": "5e0b148f-4460-441b-9039-be9c9b25c8d2", "metadata": {}, "source": [ "As we see above, we now have some events that we can analyse.\n", "\n", "We can parse the contents of each of the events with the `event_based_logging_parameter_list_dict`."
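, "\n", "For illustration of how that dictionary is used: each event type maps to an ordered list of parameter names, and zipping that list with the whitespace-split event line yields a dictionary. The column names below are the first four of every event; the shortened UUID and line are made up for the example:\n", "\n", "```python\n", "# Every event line starts with: uuid, probability, event_number, event_type\n", "parameter_names = [\"uuid\", \"probability\", \"event_number\", \"event_type\"]\n", "event_line = \"4FACA930 1 0 SN_BINARY\"  # shortened, illustrative line\n", "event_dict = dict(zip(parameter_names, event_line.split()))\n", "# event_dict[\"event_type\"] == \"SN_BINARY\"\n", "```\n"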
] }, { "cell_type": "code", "execution_count": 6, "id": "9504ed65-4fa1-4062-ac42-720ad8207d12", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{\n", " \"uuid\": \"4FACA930-D60C-45B2-876C-8FDC4E8464F5\",\n", " \"probability\": 1,\n", " \"event_number\": 0,\n", " \"event_type\": \"SN_BINARY\",\n", " \"zams_mass_1\": 10,\n", " \"zams_mass_2\": 0.1,\n", " \"zams_orbital_period\": 1000000000.0,\n", " \"zams_separation\": 9098060.0,\n", " \"zams_eccentricity\": 0,\n", " \"time\": 28.48420259305,\n", " \"metallicity\": 0.02,\n", " \"random_seed\": 59335,\n", " \"SN_post_SN_mass\": 1.33487,\n", " \"SN_post_SN_stellar_type\": 13,\n", " \"SN_type\": 14,\n", " \"SN_fallback_fraction\": 0,\n", " \"SN_fallback_mass\": 0,\n", " \"SN_post_SN_ecc\": -1,\n", " \"SN_post_SN_orbital_period\": 3239950.0,\n", " \"SN_post_SN_separation\": -1.67576,\n", " \"SN_pre_SN_mass\": 9.18507,\n", " \"SN_pre_SN_stellar_type\": 5,\n", " \"SN_pre_SN_radius\": 727.198,\n", " \"SN_pre_SN_core_mass\": 2.81957,\n", " \"SN_pre_SN_CO_core_mass\": 1.82729,\n", " \"SN_pre_SN_He_core_mass\": 2.81957,\n", " \"SN_pre_SN_fraction_omega_crit\": 0.00534839,\n", " \"SN_pre_SN_ecc\": 0,\n", " \"SN_pre_SN_orbital_period\": 3238340.0,\n", " \"SN_pre_SN_separation\": 9894060.0,\n", " \"SN_pre_SN_companion_mass\": 0.1,\n", " \"SN_pre_SN_companion_radius\": 0.134597,\n", " \"SN_pre_SN_companion_stellar_type\": 0,\n", " \"SN_starnum\": 0,\n", " \"SN_counter\": 1,\n", " \"SN_kick_v\": 404.39,\n", " \"SN_kick_omega\": 4.31965,\n", " \"SN_kick_phi\": -0.210048\n", "}\n", "{\n", " \"uuid\": \"6CF84E95-DE5F-4884-927D-CD07208DED8F\",\n", " \"probability\": 1,\n", " \"event_number\": 0,\n", " \"event_type\": \"SN_SINGLE\",\n", " \"zams_mass_1\": 10,\n", " \"time\": 28.48380621878,\n", " \"metallicity\": 0.02,\n", " \"random_seed\": 22227,\n", " \"SN_post_SN_mass\": 1.33469,\n", " \"SN_post_SN_stellar_type\": 13,\n", " \"SN_type\": 14,\n", " \"SN_fallback_fraction\": 0,\n", " 
\"SN_fallback_mass\": 0,\n", " \"SN_pre_SN_mass\": 9.1865,\n", " \"SN_pre_SN_stellar_type\": 5,\n", " \"SN_pre_SN_radius\": 724.338,\n", " \"SN_pre_SN_core_mass\": 2.81957,\n", " \"SN_pre_SN_CO_core_mass\": 1.82533,\n", " \"SN_pre_SN_He_core_mass\": 2.81957,\n", " \"SN_pre_SN_fraction_omega_crit\": 0.00539487,\n", " \"SN_starnum\": 0,\n", " \"SN_counter\": 1,\n", " \"SN_kick_v\": 438.865,\n", " \"SN_kick_omega\": 2.75425,\n", " \"SN_kick_phi\": 0.306326\n", "}\n", "{\n", " \"uuid\": \"BE487565-5C4B-4DB8-9D67-61E1E2436CA1\",\n", " \"probability\": 1,\n", " \"event_number\": 0,\n", " \"event_type\": \"RLOF\",\n", " \"zams_mass_1\": 1,\n", " \"zams_mass_2\": 0.5,\n", " \"zams_orbital_period\": 100,\n", " \"zams_separation\": 103.802,\n", " \"zams_eccentricity\": 0,\n", " \"time\": 12314.94440429,\n", " \"metallicity\": 0.02,\n", " \"random_seed\": 21341,\n", " \"RLOF_initial_mass_accretor\": 0.970542,\n", " \"RLOF_initial_mass_donor\": 0.50131,\n", " \"RLOF_initial_radius_accretor\": 41.8386,\n", " \"RLOF_initial_radius_donor\": 0.468712,\n", " \"RLOF_initial_separation\": 95.7159,\n", " \"RLOF_initial_orbital_period\": 0.244766,\n", " \"RLOF_initial_stellar_type_accretor\": 3,\n", " \"RLOF_initial_stellar_type_donor\": 0,\n", " \"RLOF_initial_orbital_angular_momentum\": 77741.6,\n", " \"RLOF_initial_stability\": 3,\n", " \"RLOF_initial_starnum_accretor\": 1,\n", " \"RLOF_initial_starnum_donor\": 0,\n", " \"RLOF_initial_time\": 12314943176.72,\n", " \"RLOF_initial_disk\": 0,\n", " \"RLOF_final_mass_accretor\": 0.337181,\n", " \"RLOF_final_mass_donor\": 0.50131,\n", " \"RLOF_final_radius_accretor\": 0.016932,\n", " \"RLOF_final_radius_donor\": 0.468712,\n", " \"RLOF_final_separation\": 2.79234,\n", " \"RLOF_final_orbital_period\": 0.00161588,\n", " \"RLOF_final_stellar_type_accretor\": 10,\n", " \"RLOF_final_stellar_type_donor\": 0,\n", " \"RLOF_final_orbital_angular_momentum\": 6111.9,\n", " \"RLOF_final_stability\": 3,\n", " \"RLOF_final_starnum_accretor\": 0,\n", " 
\"RLOF_final_starnum_donor\": 1,\n", " \"RLOF_final_time\": 12314944404.29,\n", " \"RLOF_final_disk\": 0,\n", " \"RLOF_total_mass_lost\": 0.633361,\n", " \"RLOF_total_mass_accreted\": 0,\n", " \"RLOF_total_mass_transferred\": 0.50131,\n", " \"RLOF_total_mass_lost_from_accretor\": 0,\n", " \"RLOF_total_mass_lost_from_common_envelope\": 0.633361,\n", " \"RLOF_total_time_spent_masstransfer\": 202.734,\n", " \"RLOF_episode_number\": 1\n", "}\n" ] } ], "source": [ "def recast_values(event_dict):\n", "    \"\"\"\n", "    Function to recast the values from strings to numeric values\n", "    \"\"\"\n", "\n", "    for key in event_dict.keys():\n", "        if key in ['uuid', 'event_type']:\n", "            continue\n", "\n", "        try:\n", "            if '.' in event_dict[key]:\n", "                event_dict[key] = float(event_dict[key])\n", "            else:\n", "                event_dict[key] = int(event_dict[key])\n", "        except ValueError:\n", "            # Values like '1e+09' contain no '.' but are not valid ints\n", "            event_dict[key] = float(event_dict[key])\n", "\n", "    return event_dict\n", "\n", "def parse_events(events_list, parsing_dict):\n", "    \"\"\"\n", "    Function to parse the event lines and create a dictionary for each event\n", "    \"\"\"\n", "\n", "    parsed_events_list = []\n", "\n", "    # Loop over the event lines\n", "    for event in events_list:\n", "        split_event = event.split()\n", "\n", "        # Parse the line and create a dictionary\n", "        event_type = split_event[EVENT_TYPE_INDEX]\n", "        parameter_names = parsing_dict[event_type]\n", "        event_dict = {parameter_name: parameter_value for (parameter_name, parameter_value) in zip(parameter_names, split_event)}\n", "\n", "        # Recast the values\n", "        event_dict = recast_values(event_dict=event_dict)\n", "\n", "        # Store the parsed event\n", "        parsed_events_list.append(event_dict)\n", "\n", "    return parsed_events_list\n", "\n", "event_based_logging_parameter_list_dict = event_based_logging_population.population_options['event_based_logging_parameter_list_dict']\n", "\n", "parsed_events = parse_events(events_list=combined_events, parsing_dict=event_based_logging_parameter_list_dict)\n", "\n", "for 
parsed_event in parsed_events:\n", "    print(json.dumps(parsed_event, indent=4))\n" ] }, { "cell_type": "markdown", "id": "53add532-71de-4fcb-907a-62b13ff3cef7", "metadata": {}, "source": [ "The parameters contained in each of these dictionaries are described on [the event-based logging page in the documentation](https://binary_c.gitlab.io/binary_c-python/event_based_logging_descriptions.html).\n" ] }, { "cell_type": "markdown", "id": "c533e011-89ef-42b6-860b-d7c282aa7bfa", "metadata": {}, "source": [ "The next step would be to run a larger population and log all the events of interest. We can then use the UUIDs to cross-match different events with each other and perform (complex) queries to select, e.g., the BHBH DCO_formation events that have experienced at least one pulsational pair-instability supernova but have not undergone any unstable mass transfer." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.9" } }, "nbformat": 4, "nbformat_minor": 5 }